The history of malaria predates humanity: the disease evolved before humans did. Malaria, a widespread and potentially lethal infectious disease, has afflicted people for much of human history and has shaped settlement patterns. Its prevention and treatment have been investigated in science and medicine for hundreds of years, and, since the discovery of the parasite which causes it, attention has focused on its biology. These studies continue to the present day, since no effective malaria vaccine has yet been developed and many of the older antimalarial drugs are losing effectiveness as the parasite evolves high levels of drug resistance. As malaria remains a major public health problem, causing 250 million cases of fever and approximately one million deaths annually, understanding its history is key.
Human malaria likely originated in Africa and has coevolved with its hosts, mosquitoes and non-human primates. The first evidence of malaria parasites was found in mosquitoes preserved in amber from the Palaeogene period that are approximately 30 million years old.[1] Malaria may have been a human pathogen for the entire history of the species.[2][3] Humans may have originally caught Plasmodium falciparum from gorillas.[4] About 10,000 years ago malaria started having a major impact on human survival, coinciding with the start of agriculture (the Neolithic revolution);[5] one consequence was natural selection for sickle-cell disease, thalassaemias, glucose-6-phosphate dehydrogenase deficiency, ovalocytosis, elliptocytosis and loss of the Gerbich antigen (glycophorin C) and the Duffy antigen on the erythrocytes, because such blood disorders confer a selective advantage against malaria infection (balancing selection).[6] The three major types of inherited genetic resistance (sickle-cell disease, thalassaemias, and glucose-6-phosphate dehydrogenase deficiency) were present in the Mediterranean world by the time of the Roman Empire, about 2000 years ago.[7]
References to the unique periodic fevers of malaria are found throughout recorded history. According to legend, the Chinese emperor Huang Di (Yellow Emperor, 2697–2590 BCE) ordered the compilation of a canon of internal medicine. The Chinese Huangdi Neijing (The Inner Canon of the Yellow Emperor) apparently refers to repeated paroxysmal fevers associated with enlarged spleens and a tendency to epidemic occurrence – the earliest written report of malaria.[9] In the Sushruta Samhita, a Sanskrit medical treatise (6th century BCE), the symptoms of malarial fever were described and attributed to the bites of certain insects.[10]
The term 'miasma' was coined by Hippocrates of Kos, who used it to describe dangerous fumes from the ground that are transported by winds and can cause serious illnesses.[11] The name malaria, derived from 'mala aria' ('bad air' in Medieval Italian), was probably first used by Leonardo Bruni in a publication of 1476.[12] The idea came from the ancient Romans, who thought the disease arose from the noxious fumes of swamps. The notion that the disease came from foul gases released from soil, water and air persisted throughout the nineteenth century.[13]
Malaria was once common in most of Europe and North America, where it is now, for all practical purposes, non-existent. The coastal plains of southern Italy, for example, fell from international prominence (Crusaders travelling by sea to the Holy Land took ship at Bari) when malaria expanded its reach in the sixteenth century. At roughly the same time, in the coastal marshes of England, mortality from "marsh fever" or "tertian ague" ("the ague", from Latin febris acuta) was comparable to that in sub-Saharan Africa today.[14][15] William Shakespeare was born at the start of the especially cold period that climatologists call the "Little Ice Age", yet he was aware enough of the ravages of the disease to mention it in eight of his plays.[16] Throughout history the most critical factors in the spread or eradication of the disease have been human behavior (shifting population centers, changing farming methods and the like) and living standards. Precise statistics do not exist because many cases occur in rural areas where people lack access to hospitals or the means to afford health care; as a consequence, the majority of cases are undocumented.[17] Poverty has been, and remains, a reason the disease persists in some regions while it has declined in others.[18] Climate change is likely to affect future trends in malaria transmission, but the severity and geographic distribution of such effects remain uncertain and are attracting increasing scientific attention.[19]
The introduction of molecular methods confirmed the high prevalence of P. falciparum malaria in ancient Egypt.[21][22] The historian Herodotus (484–425 BCE) wrote that the builders of the Egyptian pyramids were given large amounts of garlic,[23] likely to protect them against malaria. Sneferu, the founder of the Fourth Dynasty of Egypt, who reigned from around 2613–2589 BCE, used bed-nets as protection against mosquitoes, and Cleopatra VII, the last Pharaoh of Ancient Egypt, also slept under a mosquito net.[24] Malaria became widely recognized in ancient Greece by the 4th century BCE, and is implicated in the decline of many city-state populations. Hippocrates (460–370 BCE), the "father of medicine", related the presence of intermittent fevers to climatic and environmental conditions and classified the fevers by periodicity: tritaios pyretos / febris tertiana (every third day) and tetartaios pyretos / febris quartana (every fourth day).[25][26]
For thousands of years, traditional herbal remedies have been used to treat malaria.[27] Around 168 BCE the herbal remedy qing-hao (Artemisia annua) came into use in China to treat female hemorrhoids, as recorded in the Recipes for 52 Kinds of Diseases unearthed from the Mawangdui tombs.[20]
Qinghao was first recommended for acute intermittent fever episodes by Ge Hong as an effective medication in the 4th-century Chinese manuscript Zhou hou bei ji fang, usually translated as "Emergency Prescriptions Kept in One's Sleeve".[29] His recommendation was to soak fresh plants of the artemisia herb in cold water, wring them out and ingest the expressed bitter juice in its raw state.[30][31]
Medical accounts and ancient autopsy reports state that tertian malarial fevers caused the deaths of four members of the Medici family of Florence: Eleonora of Toledo (1522–1562), Cardinal Giovanni (1543–1562), Don Garzia (1547–1562) and Grand Duke Francesco I (1531–1587). These claims have been reexamined with more modern methodologies, which confirmed the presence of P. falciparum in the remains, corroborating the original diagnoses.[32]
Treatment of malaria was discussed in several European herbal texts of the Renaissance and after, including those of Otto Brunfels (1532), Leonhart Fuchs (1543), Adam Lonicer (1560), Hieronymus Bock (1577), Pietro Andrea Mattioli (1590), and Theodor Zwinger (1696).[33]
European settlers and their West African slaves likely brought malaria to the Americas in the 16th century.[34][35] Spanish missionaries found that fever was treated by Amerindians near Loxa (Peru) with powder from Peruvian bark (Cinchona succirubra).[36] There are no references to malaria in the "medical books" of the Mayans or Aztecs. Quinine, a toxic plant alkaloid long used by the Quechua Indians of Peru to reduce the shaking caused by severe chills in the Andes, is an effective muscle relaxant, as its modern use for nocturnal leg cramps suggests.[37][38] The Jesuit Brother Agostino Salumbrino (1561–1642), an apothecary by training who lived in Lima, observed the Quechua using the quinine-containing bark of the cinchona tree for that purpose. While its effect in treating malaria (and hence malaria-induced shivering) was entirely unrelated to its effect in controlling shivering from cold, it was nevertheless the correct medicine for malaria. The use of the "fever tree" bark was introduced into European medicine by Jesuit missionaries (Jesuit's bark).[39][40] The Jesuit Barnabé de Cobo (1582–1657), who explored Mexico and Peru, is credited with taking cinchona bark to Europe: he brought the bark from Lima to Spain, and afterwards to Rome and other parts of Italy, in 1632. Francesco Torti published in 1712 that only "intermittent fever" was amenable to the fever tree bark (Therapeutice Specialis ad Febres Periodicas Perniciosas, Modena, 1712). This work finally established the specific nature of cinchona bark and brought about its general use in medicine.[41] In 1717, the dark pigmentation of a postmortem spleen and brain was described by Giovanni Maria Lancisi in his malaria textbook De noxiis paludum effluviis eorumque remediis. He related the prevalence of malaria in swampy areas to the presence of flies and recommended swamp drainage to prevent it.[42]
Quinine
In 1820 Pierre Joseph Pelletier and Joseph Bienaimé Caventou separated the alkaloids cinchonine and quinine from powdered fever tree bark, allowing for the creation of standardized doses of the active ingredients.[43][44] Before 1820, the bark was first dried, ground to a fine powder and then mixed into a liquid (commonly wine), which was then drunk.[45]
An English trader, Charles Ledger, and his Amerindian servant, Manuel Incra Mamani, spent four years collecting cinchona seeds in the Andes of Bolivia; the seeds were highly prized for their quinine content but their export was prohibited. Ledger managed to get some seeds out; in 1865 the Dutch government bought a small parcel, and 20,000 trees of the famous Cinchona ledgeriana were successfully cultivated in Java (Indonesia). By the end of the nineteenth century the Dutch had established a world monopoly on the supply of quinine.[46]
'Warburg's Tincture'
In 1834, in British Guiana (now Guyana), a German physician, Carl Warburg, invented an antipyretic medicine, 'Warburg's Tincture'. This secret, proprietary remedy contained quinine and a number of other herbs. Trials were made in Europe in the 1840s and 1850s, and it was officially adopted by the Austrian Empire in 1847. Warburg's Tincture gained a high international reputation: many eminent medical professionals considered it a more efficacious antimalarial drug than quinine, and it was also more economical. The British Government supplied Warburg's Tincture to troops in India and other colonies.[47][48]
Synthetic drugs
In the 1850s William Henry Perkin, a student of August Wilhelm von Hofmann at the Royal College of Chemistry in London, tried to synthesize quinine in a commercially practicable process. The idea was to combine two equivalents of N-allyltoluidine (C10H13N) with three atoms of oxygen to produce quinine (C20H24N2O2) and water. The experiments were unsuccessful; instead, Perkin's attempted oxidation of N-allyltoluidine yielded the dye mauveine (Perkin's mauve).[49] Before Perkin's discovery all dyes and pigments were derived from roots, leaves, insects, or, in the case of Tyrian purple, molluscs. Perkin's discovery of artificially synthesized dyes led to important advances in medicine, photography, and many other fields.
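Perkin's hoped-for reaction can be written out explicitly; the scheme balances atom for atom, which is why it looked plausible on paper in an era when chemists reasoned from molecular formulas alone, long before structural theory revealed that the two molecules are entirely unrelated:

```latex
% Perkin's intended (never realized) synthesis of quinine:
% two equivalents of N-allyltoluidine plus three atoms of oxygen
\[
2\,\mathrm{C_{10}H_{13}N} \;+\; 3\,\mathrm{O}
\;\longrightarrow\;
\underbrace{\mathrm{C_{20}H_{24}N_{2}O_{2}}}_{\text{quinine}} \;+\; \mathrm{H_{2}O}
\]
% Atom bookkeeping:
%   C:  2(10)      = 20          = 20
%   H:  2(13) = 26 = 24 + 2      = 26
%   N:  2(1)       = 2           = 2
%   O:  3          = 2  + 1      = 3
```

The bookkeeping in the comments is the whole argument: every element tallies on both sides, yet no such direct oxidation exists, since formula-level arithmetic says nothing about molecular structure.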
In 1891 Paul Guttmann and Paul Ehrlich noted that methylene blue had a high affinity for some tissues and a slight antimalarial property.[50] Methylene blue and its congeners may act by preventing the biocrystallization of heme.[51] A mixture of eosin Y, methylene blue and demethylated methylene blue (azure B) was later used for a number of blood film staining procedures (the Malachowski, Romanowsky and Giemsa stains).[52][53] Ehrlich, the founder of chemotherapy, advocated the rational development of drugs by exploiting biochemical differences ("magic bullets").[54]
Johann Heinrich Meckel[56] recorded in 1848 innumerable black-brown pigment granules in the blood and spleen of a patient who had died in a psychiatric hospital. Meckel was probably looking at malaria parasites without realizing it; malaria was not mentioned in his report, and he thought the pigment was melanin.[57] The causal relationship of the pigment to the parasite was established in 1880, when the French physician Charles Louis Alphonse Laveran, working in the military hospital of Constantine, Algeria, observed pigmented parasites inside the red blood cells of people suffering from malaria. He also witnessed the events of exflagellation and became convinced that the moving flagella were parasitic microorganisms. He noted that quinine removed the parasites from the blood. Laveran called this microscopic organism Oscillaria malariae and proposed that malaria was caused by this protozoan.[58] The discovery was not initially well received and remained controversial until the development of the oil immersion lens in 1884 and of superior staining methods in 1890–1891.
In 1885 Ettore Marchiafava, Angelo Celli and Camillo Golgi studied the reproduction cycles in human blood (Golgi cycles). Golgi observed that all parasites present in the blood divided almost simultaneously at regular intervals and that division coincided with attacks of fever. He also recognized that the three types of malaria are caused by different protozoan organisms. By 1890 Laveran's germ was generally accepted, but most of Laveran's initial ideas had been discarded in favor of the taxonomic work and clinical pathology of the Italian school. Marchiafava and Celli called the new microorganism Plasmodium.[59] Pel, presaging its discovery by over 50 years, proposed the first theory of the existence of a tissue stage of the malaria parasite in 1886. The suggestion was reiterated in 1893, when Golgi proposed that the parasites might have an undiscovered tissue phase, this time in endothelial cells.[60] Pel in 1896 supported Golgi's latent-phase theory.[61] Also in 1886, Golgi described the morphological differences that are still used to distinguish the two malaria parasite species Plasmodium vivax and Plasmodium malariae. Shortly afterwards, Sakharov in 1889 and Marchiafava & Celli in 1890 independently identified Plasmodium falciparum as a species distinct from P. vivax and P. malariae. In 1890, Grassi and Feletti reviewed the available information and named both P. malariae and P. vivax, writing (in French): "That is why we distinguish, in the genus Haemamoeba, three species (H. malariae of quartan fever, H. vivax of tertian fever and H. praecox of quotidian fever with short intermissions, etc.)."[62] H. vivax was soon renamed Plasmodium vivax. In 1892 Marchiafava and Bignami proved that the multiple forms seen by Laveran belong to a single species, eventually named P. falciparum.
Laveran was awarded the 1907 Nobel Prize for Physiology or Medicine "in recognition of his work on the role played by protozoa in causing diseases".[63] In 1897 the sexual stages of a related haematozoan, Haemoproteus columbae, in the blood were discovered by William MacCallum in infected birds.[58]
Giovanni Maria Lancisi, John Crawford,[64] Patrick Manson,[65] Josiah C. Nott, Albert Freeman Africanus King,[66] and Charles Louis Alphonse Laveran developed theories that malaria might be caused by mosquito bites, but there was little evidence to support the idea. An early effort at malaria prevention occurred in 1896, when a malaria outbreak in Uxbridge prompted health officer Dr. Leonard White to write a report to the State Board of Health, which led to a study of mosquito-malaria links and the first efforts at malaria prevention.[67] Massachusetts state pathologist Theobald Smith asked that White's son collect mosquito specimens for further analysis, and that citizens 1) add screens to windows and 2) drain collections of water.[67]
It was Britain's Sir Ronald Ross, an army surgeon working in Secunderabad, India, who proved in 1897 that malaria is transmitted by mosquitoes. He found pigmented malaria parasites in a mosquito that he had artificially fed on a malaria patient named Hussain Khan, who had crescents in his blood. He continued his research into malaria by showing that certain mosquito species (Culex fatigans) transmit malaria to sparrows, and isolated malaria parasites from the salivary glands of mosquitoes that had fed on infected birds.[68] He reported this to the British Medical Association in Edinburgh, Scotland in 1898 and was greeted with a standing ovation.
Giovanni Battista Grassi, professor of comparative anatomy at the University of Rome, showed that human malaria could only be transmitted by Anopheles (Greek anofelís: "good-for-nothing") mosquitoes.[69] Grassi, along with his coworkers Amico Bignami, Giuseppe Bastianelli and Ettore Marchiafava, announced at the session of the Accademia dei Lincei on December 4, 1898 that a healthy man in a non-malarial zone had contracted tertian malaria after being bitten by an experimentally infected Anopheles claviger.
In 1899 Bastianelli and Bignami were the first to observe the complete P. vivax transmission cycle from mosquito to human and back.
A bitter dispute over priority broke out between the British and Italian schools of malariology, but Ross received the 1902 Nobel Prize for Physiology or Medicine for "his work on malaria, by which he has shown how it enters the organism and thereby has laid the foundation for successful research on this disease and methods of combating it".[70] In Cuba, Carlos Finlay found that yellow fever is transmitted by another mosquito, now known as Aedes aegypti; his observation was confirmed by a medical board headed by Walter Reed. Yellow fever and malaria among workers had seriously delayed construction of the Panama Canal; mosquito control instituted by William C. Gorgas dramatically reduced the problem.[71] Relapses were first noted in 1897 by Thayer, who recounted the experience of a physician who suffered a relapse of malaria twenty-one months after leaving an endemic area; he proposed the existence of a tissue stage.[72]
In 1876 methylene blue was synthesized by Heinrich Caro at BASF, a German chemical company.[73] Robert Koch in 1882 used this dye to discover the cause of tuberculosis, Mycobacterium tuberculosis. Paul Ehrlich in 1880 described the use of "neutral" dyes, mixtures of acidic and basic dyes, for the differentiation of cells in peripheral blood smears. In 1886 Bernthsen prepared a relatively pure dye, obtained by decomposition of methylene blue, which he termed methylene azure. In 1891 Ernst Malachowski[74] and Dmitri Leonidovich Romanowsky[75] independently developed techniques using a mixture of eosin Y and modified methylene blue (methylene azure) that produced a surprising hue unattributable to either staining component: a beautiful, distinctive shade of purple.[76][77] Malachowski used alkali-treated methylene blue solutions, while Romanowsky used methylene blue solutions that had been moulded or aged. The new method differentiated blood cells and demonstrated the nuclei of malarial parasites. Jenner in 1899 introduced methanol as a solvent for the dye precipitate.
Malachowski's staining technique is one of the most significant technical advances in the history of malaria.[78]
The existence of relapses was confirmed by Patrick Manson, who allowed infected Anopheles mosquitoes to feed on his eldest son, Patrick Thurburn Manson.[79] The younger Manson then described a relapse nine months after his apparent cure with quinine.[80]
Also in 1900 Amico Bignami and Giuseppe Bastianelli found that they could not infect an individual with blood containing only gametocytes.[81] The possibility of the existence of a chronic blood stage infection was proposed by Ross and Thompson in 1910.[82]
In 1903 Fritz Schaudinn erroneously reported direct infection of erythrocytes by infective sporozoites of P. vivax.[83] Schaudinn's error dominated scientific opinion for over forty years.
The existence of exoerythrocytic merogony of non-human malaria parasites in the internal organs was first demonstrated by Aragão in 1908.[84]
In 1920 Félix Mesnil and Émile Roubaud achieved the first experimental infection of chimpanzees with P. vivax.[85]
Three possible mechanisms of relapse were proposed by Marchoux in 1926: (i) parthenogenesis of macrogametocytes; (ii) persistence of schizonts in small numbers in the blood, where their multiplication is inhibited by immunity until that immunity wanes; and/or (iii) reactivation of an encysted body in the blood.[86] James in 1931, noting the lack of activity of quinine against sporozoites, proposed that after being injected by the mosquito the sporozoites are carried to internal organs, where they enter reticuloendothelial cells and undergo a cycle of development.[87] Huff and Bloom in 1935 demonstrated the exoerythrocytic stages of avian malaria.[88] In 1945 Fairley et al. reported that inoculation of blood from a patient with P. vivax may fail to induce malaria in a susceptible recipient even though the donor may subsequently develop overt malaria; the sporozoites disappeared from the blood stream within one hour and reappeared eight days later, suggesting that persistent tissue forms existed.[89] Using mosquitoes rather than blood, Shute in 1946 described a similar phenomenon and proposed the existence of an 'x-body' or resting form.[90] The following year Sapero proposed a link between a tissue stage not yet discovered in patients with malaria and the phenomenon of relapse.[91] Garnham in 1947 described exoerythrocytic schizogony in Hepatocystis (Plasmodium) kochi.[92] In the following year Shortt and Garnham described the liver stages of P. cynomolgi in the monkey.[93] In the same year a human volunteer consented to receive a massive dose of infected sporozoites of P. vivax and to undergo a liver biopsy three months later, allowing Shortt et al. to demonstrate the tissue stage of a human malarial parasite.[94] The tissue form of Plasmodium ovale was described in 1954, and that of P. malariae in 1960, in experimentally infected chimpanzees.
The latent or dormant liver form of the parasite (the hypnozoite), responsible for the late relapses characteristic of P. vivax and P. ovale infections,[95] was observed in the 1980s.[58][96] The term hypnozoite was coined by Miles B. Markus, then a PhD student at Imperial College London. In 1976, he speculated: "If sporozoites of Isospora can behave in this fashion, then those of related Sporozoa, like malaria parasites, may have the ability to survive in the tissues in a similar way." He adopted the term "hypnozoite" for malaria in 1978, when he wrote in a little-known journal that the name would "... describe any dormant sporozoites or dormant, sporozoite-like stages in the life cycles of Plasmodium or other Haemosporina".[97]
In 1982 Krotoski et al. reported the identification of P. vivax hypnozoites in liver cells of infected chimpanzees, and in 1984 Mazier et al. reported the in vitro cultivation of P. vivax liver stages in human hepatocytes. In 1989 chloroquine resistance in P. vivax was reported in Papua New Guinea.
In the early twentieth century, before antibiotics, patients with syphilis were intentionally infected with malaria to create a fever. In the 1920s Julius Wagner-Jauregg, a Viennese psychiatrist, began to treat neurosyphilitics with induced P. vivax malaria. Three or four bouts of fever were enough to kill the temperature-sensitive syphilis bacterium (Spirochaeta pallida, now known as Treponema pallidum), after which the P. vivax infection was terminated with quinine. By accurately controlling the fever with quinine, the effects of both syphilis and malaria could be minimized. Although some patients died from malaria, this was preferable to the almost-certain death from syphilis.[98] Therapeutic malaria opened up a wide field of chemotherapeutic research and was practised until 1950.[99] Wagner-Jauregg was awarded the 1927 Nobel Prize in Physiology or Medicine for his discovery of the therapeutic value of malaria inoculation in the treatment of dementia paralytica.[100]
Efforts to control the spread of malaria suffered a major setback in 1930, when the entomologist Raymond Corbett Shannon discovered disease-bearing Anopheles gambiae mosquitoes living in Brazil, likely brought there by plane or fast mail steamer.[101] This mosquito species, native to Africa, is a particularly efficient vector of malaria.[102] In 1938, the introduced vector caused the greatest epidemic of malaria ever seen in the New World. However, complete eradication of A. gambiae from north-east Brazil, and thus from the New World, was achieved in 1940 by meticulous application of Paris green to breeding places and of pyrethrum spray to adult resting places.[103]
In the 1930s Hans Andersag and colleagues synthesized and tested about 12,000 compounds at the Elberfeld laboratories of IG Farben (Germany), succeeding in producing Resochin as a substitute for quinine (Dtsch.-Reichs-Pat. 683692);[104][105] it is chemically related to quinine through its quinoline nucleus and dialkylaminoalkylamino side chain. Resochin (a RESOrcinate of a 4-aminoCHINoline; 7-chloro-4-[4-(diethylamino)-1-methylbutylamino]quinoline) and the similar compound Sontochin (3-methyl-Resochin) were synthesized in 1934 in close cooperation with American companies.[106] There were over 2,000 cartel agreements between IG Farben and foreign firms, including Standard Oil of New Jersey, DuPont, Dow Chemical Company, and others in the United States.[107] In March 1946 the drug was officially named chloroquine.[108] Chloroquine inhibits hemozoin production through biocrystallization and is one of the best antimicrobials ever developed. Quinine and chloroquine affect malarial parasites only at the stages of their life cycle in which the parasites form hematin pigment (hemozoin) as a byproduct of hemoglobin degradation. Because the drug target of chloroquine is host-derived, the emergence of resistance was markedly delayed: it took P. falciparum 19 years to develop resistance to chloroquine.[109] The first chloroquine-resistant strains were detected around the Cambodia–Thailand border and in Colombia in the 1950s.[110] These resistant strains spread rapidly, causing a large increase in mortality from the disease, particularly in Africa during the 1990s.[111]
Until the 1950s the screening of antimalarial drugs was carried out on avian malaria. This was less than satisfactory, as the avian malaria species differ in a number of ways from those that infect humans. The discovery in 1948 of Plasmodium berghei in wild rodents in the Congo,[112] and later of other rodent parasite species that could infect laboratory rats, transformed the tests used for drug development. The short hepatic phase and life cycle of these parasites made them extremely useful animal models, a status they still retain.[58]
Plasmodium cynomolgi in rhesus monkeys (Macaca mulatta) was used in the 1960s to test drugs active against P. vivax. This model is also still in use.
Genetically modified mice (non-obese diabetic/severe combined immunodeficient and BXN) can be engrafted with human stem cells and used as models for Plasmodium falciparum. Although the model shows variable reproducibility, it has been used in some experiments.
Growth of the liver stages in animal-free systems proved difficult but was achieved in the 1980s, when P. berghei pre-erythrocytic stages were grown in WI-38, a human embryonic lung cell line. This was followed by their growth in the human hepatoma line HepG2. Both P. falciparum and P. vivax have been grown in human liver cells; partial development of P. ovale in human liver cells has also been achieved; and P. malariae has been grown in chimpanzee and Aotus liver cells.
Systematic screening of traditional Chinese medicinal herbs was carried out by a number of Chinese research teams, comprising hundreds of scientists, in the 1960s and 1970s.[113] Qinghaosu, later named artemisinin, was cold-extracted in a neutral milieu (pH 7.0) from the dried leaves of Artemisia annua, inspired by Ge Hong's recommendation.[29][114]
Artemisinin was isolated by Tu Youyou, a Chinese pharmacologist. Tu headed a team of investigators tasked by the Chinese government with finding a treatment for chloroquine-resistant malaria. Their work, known as Project 523, was named after the date it was announced: May 23, 1967. The team investigated more than 2,000 Chinese herb preparations and by 1971 had made 380 extracts from 200 herbs. An extract from qinghao (Artemisia annua) was effective, but the results were variable. Tu reviewed the literature, including the 4th-century book Zhou hou bei ji fang (A Handbook of Prescriptions for Emergencies) by the Chinese physician Ge Hong, which contained the only useful reference to the herb: "A handful of qinghao immersed with two litres of water, wring out the juice and drink it all." After making a non-toxic, neutral extract, Tu and two team members volunteered to take the extract before antimalarial trials were done in patients. The first studies were published in Chinese; the first English-language paper citing successful trials of artemisinin appeared in 1979, its authors anonymous according to Chinese custom at the time. Tu presented her findings at a United Nations scientific meeting in Beijing in 1981. Since then artemisinin has become a standard treatment for malaria.
Artemisinin is a sesquiterpene lactone containing a peroxide group, which is believed to be essential for its antimalarial activity. Its derivatives, artesunate and artemether, have been used in clinics since 1987 for the treatment of drug-resistant and drug-sensitive malaria, especially cerebral malaria. These drugs are characterized by fast action, high efficacy and good tolerance. They kill the asexual forms of P. berghei and P. cynomolgi and have transmission-blocking activity.[115] In 1985, Zhou Yiqing and his team combined artemether and lumefantrine into a single tablet, which was registered as a new medicine in China in 1992 and later became known as Coartem.[116] Artemisinin combination treatments (ACTs) are now widely used to treat uncomplicated falciparum malaria, but access to ACTs is still limited in most malaria-endemic countries, and only a minority of the patients who need them actually receive them.[117] Improved agricultural practices, selection of high-yielding hybrids, microbial production, and the development of synthetic peroxides should lower prices.[118][119]
Othmar Zeidler is credited with the first synthesis of DDT (dichlorodiphenyltrichloroethane), in 1874.[120] The insecticidal properties of DDT were identified in 1939 by the chemist Paul Hermann Müller of the Swiss firm Geigy. For his discovery of the high efficiency of DDT as a contact poison against several arthropods he was awarded the Nobel Prize in Physiology or Medicine in 1948.[121] In the fall of 1942, samples of the chemical were acquired by the United States, Britain, and Germany, and laboratory tests demonstrated that it was highly effective against insects. As the Rockefeller studies in Mexico showed, DDT remained effective for six to eight weeks when sprayed on the inside walls and ceilings of houses and other buildings. The first field test in which residual DDT was applied to the interior surfaces of all habitations and outbuildings was carried out in central Italy in the spring of 1944; the objective was to determine the residual effect of the spray on anopheline density in the absence of other control measures. Spraying began in Castel Volturno and, a few months later, in the delta of the Tiber. The unprecedented effectiveness of the chemical was confirmed: the new insecticide could achieve the eradication of malaria through the eradication of mosquitoes. At the end of World War II a massive malaria control program based on DDT spraying was carried out in Italy. In Sardinia, the second largest island in the Mediterranean, the Rockefeller Foundation conducted a large-scale experiment between 1946 and 1951 to test the feasibility of the strategy of "species eradication" of an endemic malaria vector.[122] Malaria was effectively eliminated in the United States through the use of DDT in the National Malaria Eradication Program (1947–52). The concept of eradication prevailed at the Eighth World Health Assembly in 1955, when DDT was adopted as a primary tool in the fight against malaria.
DDT was banned in the US in 1972, after the debate opened by Silent Spring, the 1962 book by the American biologist Rachel Carson that launched the environmental movement in the West. The book catalogued the environmental impacts of the indiscriminate spraying of DDT and suggested that DDT and other pesticides may cause cancer and that their agricultural use was a threat to wildlife. More recently, the U.S. Congress, Republicans and Democrats alike, has supported indoor DDT spraying as a vital component of any successful malaria control program, and the U.S. Agency for International Development has initiated DDT and other insecticide spraying programs in some poor tropical countries.[123]
A wide range of other insecticides is available for mosquito control, in addition to the measures of draining wetland breeding grounds and providing better sanitation. Pyrethrum (Chrysanthemum [or Tanacetum] cinerariaefolium) is an economically important source of natural insecticide. Pyrethrins attack the nervous systems of all insects. A few minutes after application, the insect cannot move or fly away, and female mosquitoes are inhibited from biting.[124] The use of pyrethrum in insecticide preparations dates back to Persia, about 400 BCE. Pyrethrins are non-persistent, being biodegradable and also breaking down easily on exposure to light. The majority of the world's supply of pyrethrin and Chrysanthemum cinerariaefolium comes from Kenya. The flower was first introduced into Kenya and the highlands of Eastern Africa during the late 1920s. The flowers of the plant are harvested shortly after blooming and are either dried and powdered, or the oils within the flowers are extracted with solvents.
The first successful continuous malaria culture was established in 1976 by William Trager and James B. Jensen, substantially facilitating research into the molecular biology of the parasite and the development of new drugs. By using increasing volumes of culture medium, P. falciparum can be grown to higher levels of parasitemia (above 10%).[125][126]
The use of antigen-based malaria rapid diagnostic tests (RDTs) was pioneered in the 1980s.[127] Giemsa microscopy and RDTs represent the two diagnostics most likely to have the largest impact on malaria control today. Rapid diagnostic tests for malaria do not require any special equipment and offer the potential to extend accurate malaria diagnosis to areas where microscopy services are not available.[128][129]
Drug resistance poses a growing problem in the treatment of malaria in the 21st century, since resistance is now common against all classes of antimalarial drugs, with the exception of the artemisinins.[130] This situation has resulted in the treatment of resistant strains becoming increasingly dependent on this class of drugs. However, the artemisinins are expensive, which limits their use in the developing world.[118] Worrisome evidence is now emerging of malaria parasites on the Cambodia–Thailand border that are resistant to combination therapies that include artemisinins, raising the possibility that strains of malaria may have evolved that are untreatable with currently available drugs.[131][132] Exposure of the parasite population to artemisinin monotherapies at subtherapeutic doses for over 30 years, and the availability of substandard artemisinins, have probably been the main driving forces in the selection of the resistant phenotype in the region.[133]
The application of genomics to malaria research is now of central importance. With the sequencing of the genomes of the malaria parasite P. falciparum, one of its vectors, Anopheles gambiae, and the human genome, the genetics of all three organisms in the malaria lifecycle can now be studied.[134] This breakthrough is expected to produce advances in understanding the interactions between the parasite and its human host, such as those between virulence factors and the human immune system, as well as allowing the identification of the factors that restrict one species of parasite to one or a few species of mosquitoes. These advances are likely to eventually lead to new therapeutic approaches.[135][136] Another new application of genetic technology is the ability to produce genetically modified mosquitoes that are unable to transmit malaria, allowing biological control of malaria transmission.[137]
The World Health Organization (WHO) recommends indoor residual spraying as one of three primary means of malaria control, the others being the use of insecticide-treated mosquito nets (ITNs) and prompt treatment of confirmed cases with artemisinin-based combination therapies (ACTs). In 2000, only 1.7 million (1.8%) African children living in stable malaria-endemic conditions were protected by an ITN. That number increased to 20.3 million (18.5%) African children using ITNs by 2007, leaving 89.6 million children unprotected.[138] An increased percentage of African households (31%) were estimated to own at least one ITN in 2008 (WHO World Malaria Report 2009). Most nets are impregnated with pyrethroids, a class of insecticides with particularly low toxicity. Dow AgroSciences developed a microencapsulated formulation of the organophosphate chlorpyrifos-methyl as a cost-effective, long-lasting alternative to DDT. As an indoor residual spray against pyrethroid-resistant mosquitoes, chlorpyrifos-methyl outperformed DDT and lambda-cyhalothrin.[139] Organizations such as the Clinton Foundation continue to supply anti-malarial drugs to Africa and other affected areas; according to director Inder Singh, in 2011 more than 12 million individuals will be supplied with subsidized anti-malarial drugs.[140] Other organizations, such as Malaria No More, continue distribution of broader-based prophylaxis.
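The 2007 ITN coverage figures above are internally consistent; a quick arithmetic sketch (a consistency check only, using the counts quoted from the cited WHO data) reproduces the 18.5% coverage rate from the protected and unprotected counts:

```python
# Cross-check the 2007 ITN figures quoted above (WHO data).
protected_2007 = 20.3e6    # African children protected by an ITN in 2007
unprotected_2007 = 89.6e6  # African children left unprotected in 2007

total = protected_2007 + unprotected_2007   # implied at-risk population
coverage = protected_2007 / total           # fraction protected

print(f"total at risk: {total / 1e6:.1f} million")
print(f"ITN coverage:  {coverage:.1%}")     # matches the quoted 18.5%
```

The implied at-risk population of roughly 110 million children also makes the 2000 figure plausible: 1.7 million protected out of a similar base is on the order of the quoted 1.8%.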